Support Vector Machines with Sparse Binary High-Dimensional Feature Vectors
Authors

Kave Eshghi, Hewlett Packard Labs, 1501 Page Mill Rd., Palo Alto, CA 94304 ([email protected])
Mehran Kafai, Hewlett Packard Labs, 1501 Page Mill Rd., Palo Alto, CA 94304 ([email protected])

Abstract

We introduce SparseMinOver, a maximum-margin perceptron training algorithm based on the MinOver algorithm that can be used for SVM training when the feature vectors are sparse, high-dimensional, and binary. Such feature vectors arise when the CRO feature map is used to map the input space to the feature space. We show that the training algorithm is efficient with this type of feature vector while preserving the accuracy of the underlying SVM. We demonstrate the accuracy and efficiency of the technique on a number of datasets, including TIMIT, for which training a standard SVM with an RBF kernel is prohibitively expensive. SparseMinOver relies on storing large indices and is particularly suited to large-memory machines.

External Posting Date: March 18, 2016. Internal Posting Date: March 18, 2016. Copyright 2016 Hewlett Packard Enterprise Development LP.
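As a rough illustration of the setting described in the abstract (not the authors' implementation), a MinOver-style maximum-margin perceptron repeatedly reinforces the training example with the smallest margin; when feature vectors are sparse and binary, each example can be stored as a set of active indices, so inner products and updates touch only those coordinates. All names and parameters below are hypothetical:

```python
import numpy as np

def sparse_margin(w, active_idx, label):
    # Inner product with a sparse binary vector touches only active indices.
    return label * w[active_idx].sum()

def minover_train(examples, labels, dim, epochs=100, lr=1.0):
    """MinOver-style sketch: each step reinforces the example
    that currently has the smallest margin."""
    w = np.zeros(dim)
    for _ in range(epochs):
        margins = [sparse_margin(w, idx, y) for idx, y in zip(examples, labels)]
        worst = int(np.argmin(margins))
        idx, y = examples[worst], labels[worst]
        w[idx] += lr * y  # sparse update: only active coordinates change
    return w

# Toy separable data: each example is an array of active feature indices.
examples = [np.array([0]), np.array([2])]
labels = [1, -1]
w = minover_train(examples, labels, dim=4)
```

The cost per update is proportional to the number of active indices rather than the full feature dimension, which is what makes this style of training attractive for sparse high-dimensional feature maps.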
Similar Papers
Least Squares Support Vector Machines and Primal Space Estimation
In this paper, a methodology for estimation in kernel-induced feature spaces is presented, linking the primal-dual formulation of Least Squares Support Vector Machines (LS-SVM) to classical statistical inference techniques in order to perform linear regression in the primal space. This is done by computing a finite-dimensional approximation of the kernel-induced feature space mapping ...
CROification: Accurate Kernel Classification with the Efficiency of Sparse Linear SVM
Kernel methods have been shown to be effective for many machine learning tasks such as classification and regression. In particular, support vector machines with the Gaussian kernel have proved to be powerful classification tools. The standard way to apply kernel methods is to use the kernel trick, where the inner product of the vectors in the feature space is computed via the kernel function. ...
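The kernel trick mentioned in this blurb can be shown in a few lines: the Gaussian (RBF) kernel evaluates the feature-space inner product directly from the input vectors, without ever forming the (infinite-dimensional) feature map explicitly. This is a generic sketch, not tied to any code from the paper:

```python
import numpy as np

def rbf_kernel(x, z, gamma=0.5):
    # K(x, z) = exp(-gamma * ||x - z||^2) equals the inner product
    # <phi(x), phi(z)> in the Gaussian kernel's feature space,
    # computed without constructing phi.
    return np.exp(-gamma * np.sum((x - z) ** 2))

x = np.array([1.0, 0.0])
z = np.array([0.0, 1.0])
k = rbf_kernel(x, z)  # exp(-0.5 * 2) = exp(-1)
```

An explicit feature map such as CRO trades this implicit computation for sparse binary vectors on which a linear SVM can be trained directly.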
Trading Accuracy for Size: Online Small SVMs via Linear Independence in the Feature Space
Support Vector Machines (SVMs) are a machine learning method rooted in statistical learning theory. One of their most interesting characteristics is that the solution achieved during training is sparse, meaning that a few samples are usually considered “important” by the algorithm (the so-called support vectors) and give account of most of the complexity of the classification/regression task. I...
Facial expression recognition based on Local Binary Patterns
Classical LBP suffers from drawbacks such as high complexity and high-dimensional feature vectors, which make a dimension-reduction step necessary. In this paper, we introduce an improved LBP algorithm that addresses these problems by applying the Fast PCA algorithm to reduce the dimensionality of the extracted feature vectors. In other words, the proposed method (Fast PCA+LBP) is an improved LBP algorithm that is extracted ...
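The dimension-reduction step this blurb refers to can be sketched with ordinary PCA on feature histograms (a generic illustration, not the paper's Fast PCA variant; the data here is random stand-in for LBP histograms):

```python
import numpy as np

def pca_reduce(features, k):
    """Project feature vectors (n_samples x d) onto the top-k
    principal components. Plain PCA via SVD, used here only to
    illustrate the dimension-reduction step."""
    centered = features - features.mean(axis=0)
    # Rows of vt are the principal directions, ordered by variance.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:k].T

# Stand-in for 256-bin LBP histograms from 20 face images.
hists = np.random.RandomState(0).rand(20, 256)
reduced = pca_reduce(hists, 8)  # shape (20, 8)
```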
متن کاملAnomaly Detection Using SVM as Classifier and Decision Tree for Optimizing Feature Vectors
With the advancement and development of computer network technologies, the path for intruders has become smoother; therefore, intrusion detection systems (IDS), as one of the key elements of security, are becoming increasingly important for detecting threats and attacks. One of the challenges of intrusion detection systems is managing the large number of network traffic features. Removing un...